Search results for "Parameter Selection"

Showing 6 of 6 documents

Automation of Optimized Gabor Filter Parameter Selection for Road Cracks Detection

2016

Automated systems for road crack detection are extremely important in road maintenance for vehicle safety and travelers' comfort. Emerging cracks in roads need to be detected and repaired as early as possible to avoid further damage and reduce rehabilitation costs. This paper discusses a robust method for optimizing Gabor filter parameters for automatic road crack detection. Gabor filters have been used in previous literature for similar applications. However, automatic selection of optimized Gabor filter parameters is needed because the texture of roads and cracks varies. The problem of change of background, which in fact is road text…

Automated detection; Genetic Algorithm; Pavement Cracks; Gabor Filters; Parameter Selection; Computer Science [cs]
researchProduct
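
The entry above pairs Gabor filtering with a genetic algorithm for parameter optimization. As a rough illustration of the underlying idea (not the paper's method), the sketch below builds Gabor kernels directly in NumPy and runs a small random parameter search on a synthetic "cracked" image; the fitness measure (response contrast between crack and background pixels) and the parameter ranges are assumptions made for demonstration.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(ksize, sigma, theta, lambd, gamma, psi=0.0):
    """Real part of a Gabor kernel (standard formulation)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lambd + psi)

# Synthetic road patch: noisy background with a dark diagonal "crack".
rng = np.random.default_rng(0)
img = 0.6 + 0.05 * rng.normal(size=(128, 128))
rows = np.arange(128)
crack = np.zeros((128, 128), dtype=bool)
crack[rows, np.clip(rows // 2 + 30, 0, 127)] = True
img[crack] -= 0.4

def fitness(params):
    """Illustrative score: mean filter response on crack pixels minus background."""
    sigma, theta, lambd, gamma = params
    kernel = gabor_kernel(21, sigma, theta, lambd, gamma)
    resp = np.abs(fftconvolve(img - img.mean(), kernel, mode="same"))
    return resp[crack].mean() - resp[~crack].mean()

# Crude random search over the parameter space (the paper uses a genetic algorithm instead).
candidates = [(rng.uniform(2, 8), rng.uniform(0, np.pi),
               rng.uniform(4, 16), rng.uniform(0.3, 1.0)) for _ in range(200)]
best = max(candidates, key=fitness)
print("best (sigma, theta, lambda, gamma):", np.round(best, 2), "score:", round(fitness(best), 3))
```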

Tuning parameter selection in LASSO regression

2016

We propose a new method to select the tuning parameter in lasso regression. Unlike previous proposals, the method is iterative and is therefore particularly efficient when multiple tuning parameters have to be selected. The method also applies to more general regression frameworks, such as generalized linear models with non-normal responses. Simulation studies show that our proposal performs well and, most of the time, better than the traditional Bayesian Information Criterion and cross-validation.

tuning parameter selection; lasso; GCV; BIC; CV; Schall algorithm; Settore SECS-S/01 - Statistica
researchProduct
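
The paper above benchmarks its iterative selector against BIC and cross-validation. Purely as context, here is a minimal scikit-learn sketch of those two baselines (not the paper's algorithm): a BIC-selected versus a CV-selected lasso penalty on simulated sparse data.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV, LassoLarsIC

# Sparse ground truth: 10 informative features out of 100.
X, y = make_regression(n_samples=200, n_features=100, n_informative=10, noise=5.0, random_state=0)

# Baseline 1: penalty chosen by the Bayesian Information Criterion.
bic_model = LassoLarsIC(criterion="bic").fit(X, y)

# Baseline 2: penalty chosen by 5-fold cross-validation.
cv_model = LassoCV(cv=5, random_state=0).fit(X, y)

for name, model in [("BIC", bic_model), ("CV", cv_model)]:
    n_selected = int(np.sum(model.coef_ != 0))
    print(f"{name}: alpha={model.alpha_:.4f}, selected {n_selected} of {X.shape[1]} features")
```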

Penalized regression and clustering in high-dimensional data

The main goal of this Thesis is to describe numerous statistical techniques that deal with high-dimensional genomic data. The Thesis begins with a review of the literature on penalized regression models, with particular attention to least absolute shrinkage and selection operator (LASSO) or L1-penalty methods. L1 logistic/multinomial regression models are used for variable selection and discriminant analysis with a binary/categorical response variable. The Thesis discusses and compares several methods that are commonly utilized in genetics, and introduces new strategies to select markers according to their informative content and to discriminate clusters by offering reduced panels for popul…

Lasso regression; High-dimensional data; Genomic data; Tuning parameter selection; Quantile regression coefficients modeling; Curves clustering; Settore SECS-S/01 - Statistica
researchProduct
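
As a companion to the thesis summary above, the following sketch shows the generic L1-penalized logistic regression it reviews, applied to simulated "genomic-style" data with far more features than samples; the simulated data and the cross-validated choice of the penalty strength are illustrative assumptions, not the thesis's own analysis.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV

# p >> n setting: 500 candidate markers, only a handful informative, 80 samples.
X, y = make_classification(n_samples=80, n_features=500, n_informative=5,
                           n_redundant=0, random_state=0)

# L1-penalized logistic regression, penalty strength chosen by cross-validation.
model = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=20, cv=5,
                             random_state=0).fit(X, y)

selected = np.flatnonzero(model.coef_.ravel())
print(f"selected {selected.size} of {X.shape[1]} markers:", selected[:10], "...")
```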

ℓ1-Penalized Methods in High-Dimensional Gaussian Markov Random Fields

2016

In the last 20 years, we have witnessed the dramatic development of new data acquisition technologies that allow massive amounts of data to be collected at relatively low cost. This new feature leads Donoho to define the twenty-first century as the century of data. A major characteristic of these modern data sets is that the number of measured variables is larger than the sample size; the term high-dimensional data analysis refers to the statistical methods developed to make inference with this new kind of data. This chapter is devoted to the study of some of the most recent ℓ1-penalized methods proposed in the literature to make sparse inference in a Gaussian Markov random field (GMRF) defined …

Markov Random Field; Gaussian random field; Graphical lasso; Structured Graphical lasso; Joint Graphical lasso; High-Dimensional Inference; Tuning Parameter Selection; Settore SECS-S/01 - Statistica
researchProduct
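
A minimal illustration of the plain graphical lasso discussed in the chapter above (its structured and joint variants are not shown), using standard scikit-learn calls rather than the chapter's own code; the simulated sparse precision matrix and the cross-validated penalty are assumptions for the sketch.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV
from sklearn.datasets import make_sparse_spd_matrix

# Simulate data from a GMRF with a sparse precision (inverse covariance) matrix.
rng = np.random.default_rng(0)
precision = make_sparse_spd_matrix(20, alpha=0.95, random_state=0)
covariance = np.linalg.inv(precision)
X = rng.multivariate_normal(np.zeros(20), covariance, size=100)

# Graphical lasso with the l1 penalty chosen by cross-validation.
model = GraphicalLassoCV(cv=5).fit(X)

est_edges = np.sum(np.abs(np.triu(model.precision_, k=1)) > 1e-4)
true_edges = np.sum(np.abs(np.triu(precision, k=1)) > 1e-4)
print(f"chosen alpha={model.alpha_:.4f}; estimated edges: {est_edges}, true edges: {true_edges}")
```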

P-spline quantile regression: a new algorithm for smoothing parameter selection

Quantile regression; P-spline; Smoothing parameter selection; Non-parametric Statistics; Settore SECS-S/01 - Statistica
researchProduct

A penalized approach to covariate selection through quantile regression coefficient models

2019

The coefficients of a quantile regression model are one-to-one functions of the order of the quantile. In standard quantile regression (QR), different quantiles are estimated one at a time. Another possibility is to model the coefficient functions parametrically, an approach that is referred to as quantile regression coefficients modeling (QRCM). Compared with standard QR, the QRCM approach facilitates estimation, inference and interpretation of the results, and generates more efficient estimators. We designed a penalized method that can address the selection of covariates in this particular modelling framework. Unlike standard penalized quantile regression estimators, in which model selec…

Quantile regression; penalized quantile regression coefficients modelling (QRCMp); Penalized integrated loss minimization (PILM); Lasso penalty; Covariate; tuning parameter selection; Statistics and Probability; Statistics, Probability and Uncertainty
researchProduct
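
To make the QRCM idea above concrete, here is a deliberately crude sketch. Assumptions: a linear-in-tau basis for the coefficient functions, a plain lasso penalty, and a generic derivative-free optimizer instead of the paper's penalized integrated loss minimization algorithm. Each regression coefficient is modeled as a function of the quantile order, and all quantiles are fitted jointly by minimizing an integrated check loss plus the penalty.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])   # intercept + 3 covariates
y = 1.0 + X[:, 1] + 0.5 * rng.normal(size=n)                 # only the first covariate is active

taus = np.linspace(0.1, 0.9, 9)                              # quantile grid for the integrated loss
basis = np.column_stack([np.ones_like(taus), taus])          # b(tau) = (1, tau): beta_j(tau) = theta_j0 + theta_j1 * tau
d, q = X.shape[1], basis.shape[1]

def pinball(resid, tau):
    """Check (pinball) loss averaged over observations."""
    return np.mean(resid * (tau - (resid < 0)))

def objective(theta_flat, lam):
    Theta = theta_flat.reshape(d, q)                         # row j parametrizes the coefficient function beta_j(tau)
    loss = np.mean([pinball(y - X @ (Theta @ basis[k]), tau) for k, tau in enumerate(taus)])
    return loss + lam * np.abs(Theta[1:]).sum()              # lasso penalty; intercept function left unpenalized

for lam in (0.0, 0.05, 0.2):
    res = minimize(objective, np.zeros(d * q), args=(lam,), method="Powell")
    print(f"lambda={lam}: Theta =\n{np.round(res.x.reshape(d, q), 2)}")
```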